Bayesian Inference Featuring Entropic Priors

Author

  • Tilman Neumann
Abstract

The subject of this work is the parametric inference problem, i.e. how to infer from data the parameters of the data likelihood of a random process whose parametric form is known a priori. Under the assumption that Bayes’ theorem is used to incorporate new data samples, the problem reduces to the question of how to specify a prior before any data have been seen. For this subproblem, three theorems are stated. The first is that Jaynes’ Maximum Entropy Principle requires at least a constraint on the expected data likelihood entropy, which yields entropic priors without the need for further axioms. Second, I show that maximizing Shannon entropy under an expected data likelihood entropy constraint is equivalent to maximizing relative entropy and is therefore reparametrization invariant for continuous-valued data likelihoods. Third, I propose that, in the state of absolute ignorance about the data likelihood entropy, one should choose the hyperparameter α of an entropic prior such that the change of the expected data likelihood entropy is maximized. Among other appealing properties, this principle is equivalent to maximizing the mean-squared entropy error and is invariant under arbitrary reparametrizations of the data likelihood. Altogether, we obtain a Bayesian inference procedure that incorporates special prior knowledge when available, has a sound solution when it is not, and leaves no hyperparameters unspecified.
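As a minimal illustrative sketch of the first result (the notation H(θ) for the entropy of the data likelihood p(x|θ), the constraint value h₀, and the sign convention for α are assumptions of this sketch, not taken from the paper itself): maximizing the prior's Shannon entropy subject to a constraint on the expected data likelihood entropy is a standard Lagrange-multiplier problem whose stationary solution is exponential in H(θ), i.e. an entropic prior with hyperparameter α.

% H(\theta) denotes the entropy of the data likelihood (assumed notation):
\[
  H(\theta) \;=\; -\int p(x \mid \theta)\,\log p(x \mid \theta)\,dx .
\]
% Maximize the prior's Shannon entropy under an expected-entropy constraint:
\[
  \max_{p}\; -\int p(\theta)\,\log p(\theta)\,d\theta
  \quad\text{s.t.}\quad
  \int p(\theta)\,H(\theta)\,d\theta = h_0 ,
  \qquad \int p(\theta)\,d\theta = 1 .
\]
% The stationarity condition of the Lagrangian yields an entropic prior, with the
% hyperparameter \alpha appearing as the Lagrange multiplier of the expected-entropy
% constraint (its sign depends on how the constraint is posed):
\[
  p_{\alpha}(\theta) \;=\;
  \frac{e^{-\alpha H(\theta)}}{\int e^{-\alpha H(\theta')}\,d\theta'} .
\]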


Similar Articles

Consistency of Sequence Classification with Entropic Priors

Entropic priors, recently revisited within the context of theoretical physics, were originally introduced for image processing and for general statistical inference. Entropic priors seem to represent a very promising approach to “objective” prior determination when prior information is not available. Attention has been mostly limited to continuous parameter spaces, and our focus in this work ...


Entropic Priors for Discrete Probabilistic Networks and for Mixtures of Gaussians Models

The ongoing, unprecedented exponential explosion of available computing power has radically transformed the methods of statistical inference. What used to be the position of a small minority of statisticians advocating the use of priors and strict adherence to Bayes’ theorem is now becoming the norm across disciplines. The evolutionary direction is now clear. The trend is towards more realistic, flexi...


Bayesian Inference for Spatial Beta Generalized Linear Mixed Models

In some applications, the response variable assumes values in the unit interval. The standard linear regression model is not appropriate for modelling this type of data because the normality assumption is not met. Alternatively, the beta regression model has been introduced to analyze such observations. The beta distribution represents a flexible density family on the (0, 1) interval that covers symm...


Maximum Entropy, Fluctuations and Priors

The method of maximum entropy (ME) is extended to address the following problem: once one accepts that the ME distribution is to be preferred over all others, to what extent are distributions with lower entropy supposed to be ruled out? Two applications are given. The first is to the theory of thermodynamic fluctuations. The formulation is exact, covariant under changes of coord...


Source Localization by Entropic Inference and Backward Renormalization Group Priors

A systematic method of transferring information from coarser to finer resolution based on renormalization group (RG) transformations is introduced. It permits building informative priors at finer scales from posteriors at coarser scales since, under some conditions, RG transformations in the space of hyperparameters can be inverted. These priors are updated using renormalized data into posterio...



Publication date: 2007